Victor, S. P.
- Algorithms to Find Geodetic Numbers and Edge Geodetic Numbers in Graphs
Authors
1 Department of Computer Applications, SRM University, Kattankulathur – 603 203, Tamilnadu, IN
2 Research Department of Computer Science, St.Xavier's College (Autonomous), Palayamkottai – 627002, Tamilnadu, IN
Source
Indian Journal of Science and Technology, Vol 8, No 13 (2015), Pagination:
Abstract
For any two vertices u and v of a graph G = (V, E), any shortest path joining u and v is called a u-v geodesic. The closed interval I[u,v] of u and v is the set of those vertices belonging to at least one u-v geodesic. A subset S of V(G) is a geodetic set if every vertex of G lies in at least one closed interval between vertices of S. A geodetic set of minimum cardinality in G is called a minimum geodetic set, and its cardinality is the geodetic number of G, denoted gn(G). For a non-trivial connected graph G, a set S of V(G) is called an edge geodetic cover of G if every edge of G is contained in a geodesic joining some pair of vertices in S. The edge geodetic number egn(G) of G is the minimum order of its edge geodetic covers, and any edge geodetic cover of order egn(G) is an edge geodetic basis. This paper introduces algorithms to find geodetic numbers and edge geodetic numbers in connected graphs using a dynamic programming approach.
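The definitions above can be made concrete with a small brute-force sketch (this is not the paper's dynamic-programming algorithm, and the helper names are our own): compute all-pairs shortest-path distances by BFS, form the closed intervals I[u,v], and search subsets of V in increasing size for the first geodetic set.

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    # single-source shortest-path distances in an unweighted graph
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def geodetic_number(adj):
    # gn(G) by brute force: smallest S whose closed intervals cover V
    V = list(adj)
    dist = {u: bfs_dist(adj, u) for u in V}  # assumes G is connected
    for k in range(1, len(V) + 1):
        for S in combinations(V, k):
            covered = set(S)
            for u, v in combinations(S, 2):
                # I[u,v] = vertices w with d(u,w) + d(w,v) = d(u,v)
                d = dist[u][v]
                covered |= {w for w in V if dist[u][w] + dist[w][v] == d}
            if len(covered) == len(V):
                return k
    return len(V)
```

For the 4-cycle, either diagonal pair already forms a geodetic set, so gn(C4) = 2; in a complete graph every interval I[u,v] is just {u, v}, so gn(Kn) = n.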
Keywords
Diameter, Distance, Eccentricity, Edge Geodetic Cover, Edge Geodetic Number, Geodesic, Geodetic Number, Geodetic Set, Radius

- A Comparative Analysis of Single and Combination Feature Extraction Techniques for Detecting Cervical Cancer Lesions
Authors
1 Centre for Information Technology & Engineering, Manonmaniam Sundaranar University, IN
2 Department of Computer Science, St. Xavier's College, Palayamkottai, IN
Source
ICTACT Journal on Image and Video Processing, Vol 6, No 3 (2016), Pagination: 1167-1173
Abstract
Cervical cancer is the third most common form of cancer affecting women, especially in third-world countries. The predominant reason for such an alarming death rate is primarily the lack of awareness and proper health care. Since prevention is better than cure, a better strategy has to be put in place to screen a large number of women so that an early diagnosis can help save their lives. One such strategy is to implement an automated system. For an automated system to function properly, a proper set of features has to be extracted so that the cancer cell can be detected efficiently. In this paper we compare the performance of detecting a cancer cell using a single feature versus a combination feature set to see which better suits the automated system in terms of higher detection rate. For this, each cell is segmented using a multiscale morphological watershed segmentation technique and a series of features is extracted. This process is performed on 967 images, and the extracted data is subjected to data mining techniques to determine which feature is best for which stage of cancer. The results clearly show a higher percentage of success for the combination feature set, with a 100% accurate detection rate.
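The single-versus-combination idea can be illustrated with a minimal sketch (the descriptor choices here are our own, not the paper's actual feature set): a single descriptor such as mean intensity is computed per segmented cell, while a combination feature set simply concatenates several descriptors into one vector.

```python
import math

def intensity_features(patch):
    # single-feature descriptors: mean and variance of pixel intensities
    flat = [p for row in patch for p in row]
    n = len(flat)
    mean = sum(flat) / n
    var = sum((p - mean) ** 2 for p in flat) / n
    return mean, var

def entropy_feature(patch, levels=256):
    # texture-style descriptor: Shannon entropy of the intensity histogram
    flat = [p for row in patch for p in row]
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    n = len(flat)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def combined_feature_vector(patch):
    # combination feature set: concatenate descriptors into one vector
    mean, var = intensity_features(patch)
    return [mean, var, entropy_feature(patch)]
```

A classifier trained on `combined_feature_vector` sees strictly more information per cell than one trained on any single descriptor, which is the intuition behind the comparison in the paper.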
Keywords
Cervical Cancer, Feature Extraction, Texture Features, Content Based Image Retrieval.

- Comparative Evaluation of Contourlet and Wavelet Transform for Feature Extraction in Glaucoma Images
Authors
1 Department of Computer Applications, Noorul Islam Centre for Higher Education, Kumaracoil, Kanyakumari – 629180, Tamil Nadu, IN
2 Computer Science Department, St. Xavier’s College (Autonomous), Palayamkottai – 627007, Tamil Nadu, IN
Source
Indian Journal of Science and Technology, Vol 9, No 13 (2016), Pagination:
Abstract
Background/Objectives: The proposed system detects glaucoma-affected eyes from a fundus image database collected from a nearby eye hospital. Methods/Statistical Analysis: Feature extraction is done by the contourlet transform, and the best features are selected for classification. A support vector machine is applied to classify the images. Findings: In the conventional method, the wavelet transform is applied for feature extraction and a Support Vector Machine (SVM) classifies the glaucoma images as affected or non-affected. The classification accuracy is evaluated using both the existing and the proposed techniques. Applications/Improvements: The proposed system detects glaucoma in the human eye accurately, which eliminates the human error involved in examining the eye. The system automatically finds the disease within seconds from the supplied fundus image.
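Wavelet-style subband features of the kind the conventional method uses can be sketched with a one-level 2D Haar transform in plain Python (an illustrative stand-in only: a real system would use a wavelet or contourlet library and feed subband statistics to an SVM):

```python
def haar_dwt2(img):
    # one-level 2D Haar transform on an even-sized grayscale image:
    # returns the approximation (LL) and three detail subbands
    h, w = len(img), len(img[0])
    LL = [[0.0] * (w // 2) for _ in range(h // 2)]
    LH = [[0.0] * (w // 2) for _ in range(h // 2)]
    HL = [[0.0] * (w // 2) for _ in range(h // 2)]
    HH = [[0.0] * (w // 2) for _ in range(h // 2)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            LL[i // 2][j // 2] = (a + b + c + d) / 4  # local average
            LH[i // 2][j // 2] = (a + b - c - d) / 4  # horizontal detail
            HL[i // 2][j // 2] = (a - b + c - d) / 4  # vertical detail
            HH[i // 2][j // 2] = (a - b - c + d) / 4  # diagonal detail
    return LL, LH, HL, HH

def subband_energy(band):
    # a typical scalar feature per subband for classifier input
    return sum(x * x for row in band for x in row)
```

The energies of the detail subbands capture edge and texture content of the fundus image; a feature vector of such energies is what the SVM would be trained on.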
Keywords
Feature Extraction in Glaucoma Images using Wavelet and Contourlet Transform

- Comparative Study on Various Types of Filters
Authors
1 PSG College of Arts and Science, Coimbatore, Tamil Nadu 641014, IN
2 Department of Computer Science, St. Xavier's College (Autonomous), Palayamkottai, Tirunelveli-627 002, IN
Source
Digital Image Processing, Vol 10, No 3 (2018), Pagination: 48-49
Abstract
In this paper, we discuss four main filters: the mean filter, Gaussian filter, median filter and spatial filter. Their characteristics, outputs and execution times are compared, so that a user can select a suitable filter by knowing its characteristics and performance.
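Minimal one-dimensional versions of the most common of these filters can be sketched as follows (our own implementations; the window size k and sigma are illustrative parameters, not values from the paper):

```python
import math
import statistics

def mean_filter(signal, k=3):
    # sliding-window average; edges handled by clamping the window
    half = k // 2
    out = []
    for i in range(len(signal)):
        win = signal[max(0, i - half): i + half + 1]
        out.append(sum(win) / len(win))
    return out

def median_filter(signal, k=3):
    # sliding-window median: robust to impulse (salt-and-pepper) noise
    half = k // 2
    return [statistics.median(signal[max(0, i - half): i + half + 1])
            for i in range(len(signal))]

def gaussian_kernel(k=3, sigma=1.0):
    # normalized Gaussian weights over a window of size k
    half = k // 2
    w = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-half, half + 1)]
    s = sum(w)
    return [x / s for x in w]

def gaussian_filter(signal, k=3, sigma=1.0):
    # weighted average with Gaussian weights; only full windows (valid region)
    kern = gaussian_kernel(k, sigma)
    half = k // 2
    return [sum(kern[j] * signal[i - half + j] for j in range(k))
            for i in range(half, len(signal) - half)]
```

The contrast in behaviour is easy to see on an impulse: the median filter removes a lone spike entirely, while the mean and Gaussian filters only smear it over the window.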
Keywords
Kernel, Sliding Window, Elapsed Time, Gaussian Distribution.

- Big Data Hadoop: A Survey on Security Issues, Challenges and Solutions
Authors
1 Manonmaniam Sundaranar University, Tirunelveli, TN, IN
2 Sree Sastha Institute of Engineering and Technology, Chennai, TN, IN
3 St. Xavier’s College, Palayamkottai, TN, IN
Source
Data Mining and Knowledge Engineering, Vol 10, No 2 (2018), Pagination: 25-27
Abstract
Big data is growing in popularity among data handlers. Hadoop treats its own security measures as a top priority, but it can still be augmented with additional security features in the areas where threats can be identified, so that authentic users can feel safer about their data. Because data coming from heterogeneous sources have their importance felt at the originating point, under ideal circumstances we expect full privacy and security to be ensured while the data is stored and maintained in common storage. To provide that assurance, in this paper we try to identify the areas where security can be further enhanced. We also discuss the threats and challenges that this effort may face, and suggest remedial countermeasures for such situations.
Keywords
Big Data, Security, Privacy, Hadoop, Map Reduce.

- Overview of Big Data Analytics System for Storing and Processing Huge Data
Authors
1 Manonmaniam Sundaranar University, Tirunelveli, TN, IN
2 Sree Sastha Institute of Engineering and Technology, Chennai, TN, IN
3 St. Xavier’s College, Palayamkottai, TN, IN
Source
Data Mining and Knowledge Engineering, Vol 10, No 2 (2018), Pagination: 28-30
Abstract
Big data involves storing huge volumes of data along with approaches and techniques to manage and process them. Over the past few years the number of people using the internet, email and other internet-based applications has grown tremendously. Big data is mainly characterized by Volume, Velocity and Variety. The Big Data Architecture Framework (BDAF) is proposed to address all aspects of the big data ecosystem and includes the following components: big data infrastructure, big data analytics, data structures and models, big data lifecycle management, and big data security. The volume of data in use is increasing exponentially, so storing, processing and protecting large volumes of data has become a great challenge in the modern hyper-connected world. Thousands of software professionals work from home on internet-connected laptops and mobile phones to develop, implement, test and maintain various applications, sending and receiving a lot of data to clients, higher authorities and other officials on a daily, weekly or as-needed basis. Traditional data management models are not efficient for big data analytics. In this paper we give an overview of a big data analytics system for storing and processing huge data.
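The Map Reduce processing model that Hadoop-style analytics systems rest on can be illustrated with a tiny in-memory word count (a conceptual sketch only; in Hadoop the map, shuffle and reduce phases run distributed across a cluster):

```python
from collections import defaultdict

def map_phase(doc):
    # map: emit an intermediate (key, value) pair per word
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    # shuffle: group all values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # reduce: aggregate each key's values into a final result
    return {key: sum(values) for key, values in groups.items()}

docs = ["Big data Hadoop", "big data analytics"]
pairs = [p for doc in docs for p in map_phase(doc)]
word_counts = reduce_phase(shuffle(pairs))
```

Because each map call and each reduce call is independent, the framework can spread them across many machines, which is what makes the model suitable for storing and processing huge data.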